Web Survey Bibliography
Theoretical background. For roughly a decade, telephone surveys have been among the most frequently used survey modes in the German social sciences, mainly because of their low costs compared with, e.g., face-to-face surveys and their high flexibility in organization and conduct (ADM 2008). However, these advantages are increasingly offset by declining response rates in academic as well as commercial surveys. Response rates have diminished steadily over time, although different sources supply rather divergent figures: some report response rates below 40 percent, others rates around 70 percent (cf. Berinsky 2008, 309f.; De Leeuw & De Heer 2002; Schnell, Hill & Esser 1999, 286ff.). The key explanations are non-contact and refusals (Berinsky 2008, 310; Lavrakas 2008, 252). Our own surveys show that both non-contact and refusal rates rose by 5-10 percentage points within the last five years, although the rates differ from survey to survey. While in September 2003 29 percent of all households could not be reached, in May 2008 this was the case for 44 percent of the households. And while in September 2003 31 percent of the interviewees refused, 43 percent did so in February 2008. The literature suggests numerous explanations for both phenomena. Non-contact occurs, e.g., because of increased mobility or the absence of a landline telephone in the household (cf. Pew 2008). In Germany, around 5-7 percent of the population have only cell phones and no landline telephone (cf. Hunsicker & Schroth 2007). If this trend continues, telephone surveys based on landline frames can no longer be regarded as representative. Refusals occur, e.g., because of over-surveying, insufficient trust in survey institutes, or lack of time (cf. Schnauber & Daschmann 2008). Against this background, the question arises whether telephone surveys are still feasible in the long run. First, the non-response may be systematic and bias the survey results, which hence (as mentioned above) can no longer be interpreted as representative (Schnauber & Daschmann 2008, 98). Second, resolving these problems or changing to another survey mode (e.g. inclusion of cell phones) would entail high additional costs. Nevertheless, there are some approaches to meeting these obstacles. The number of contacts could be improved by modifying, e.g., the call times or the number of call attempts. Refusal rates could be reduced by altering the introductory phrases and by enhancing the professionalism of the interviewers, e.g. through extensive training (cf. Meier, Schneid, Stegemann & Stiegler 2005).
Method and conduct. Long-term telephone surveys conducted at our department yield a large data basis for answering the questions deduced above. On the one hand, we aim to analyze the problem of decreasing response rates and hence the decreasing quality of the surveys. On the other hand, we pursue the question of whether the theoretically derived explanations for non-contact and refusals can be confirmed by our empirical data. In summary, we try to answer the question of whether telephone surveys will still be feasible in the future.
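The response and refusal rates discussed in the abstract can be computed precisely using the AAPOR "Standard Definitions" outcome-rate formulas (the Standard Definitions reports appear in the listing below). A minimal sketch of two of those formulas follows; the disposition counts are hypothetical, not figures from the surveys described above:

```python
# Sketch of two AAPOR outcome rates; all case counts below are hypothetical.

def rr1(complete, partial, refusal, non_contact, other, unknown=0):
    """Minimum response rate RR1: complete interviews divided by all
    potentially eligible cases, counting unknown-eligibility cases as
    eligible (hence the most conservative response rate)."""
    denom = complete + partial + refusal + non_contact + other + unknown
    return complete / denom

def ref1(complete, partial, refusal, non_contact, other, unknown=0):
    """Refusal rate REF1: refusals divided by the same denominator as RR1."""
    denom = complete + partial + refusal + non_contact + other + unknown
    return refusal / denom

# Illustrative final dispositions for a sample of 1,000 numbers:
dispositions = dict(complete=300, partial=20, refusal=430,
                    non_contact=230, other=20, unknown=0)
print(f"RR1:  {rr1(**dispositions):.1%}")   # prints "RR1:  30.0%"
print(f"REF1: {ref1(**dispositions):.1%}")  # prints "REF1: 43.0%"
```

Because unknown-eligibility cases stay in the denominator, RR1 gives a lower bound; the other AAPOR rates (RR2-RR6) relax this by estimating the eligible share of unknown cases.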
Conference homepage (abstract)
Web survey bibliography (281)
- Overview: Online Surveys; 2017; Vehovar, V.; Lozar Manfreda, K.
- Standard Definitions Final Dispositions of Case Codes and Outcome Rates for Surveys; 2016
- Retrospective Measurement of Students’ Extracurricular Activities with a Self-administered Calendar...; 2016; Furthmueller, P.
- Pitfalls, Potentials, and Ethics of Online Survey Research: LGBTQ and Other Marginalized and Hard-to...; 2016; McInroy, L. B.
- Computer-assisted and online data collection in general population surveys; 2016; Skarupova, K.
- A Statistical Approach to Provide Individualized Privacy for Surveys; 2016; Esponda, F.; Huerta, K.; Guerrero, V. M.
- Social Media Analyses for Social Measurement; 2016; Schober, M. F.; Pasek, J.; Guggenheim, L.; Lampe, C.; Conrad, F. G.
- Doing Surveys Online; 2016; Toepoel, V.
- An Overview of Mobile CATI Issues in Europe; 2015; Slavec, A.; Toninelli, D.
- Utilizing iPads in the Field; 2015; Kiser, P.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2015; 2015
- The Web Survey Revolution; 2015; Murray, D.
- Methodology of the RAND Mid-Term 2014 Election Panel; 2015; Carman, K. G.; Pollack, S.
- 28 Questions to Help Buyers of Online Samples; 2015; Cape, P. J.; Phillips, A.; Baker, R.; Cooke, M.; Ribeiro, E.; Terhanian, G.
- Ethical decision-making and Internet research 2.0: Recommendations from the AoIR ethics working committee...; 2015; Markham, A.; Buchanan, E. A.
- Doing online research involving university students with disabilities: Methodological issues; 2015; De Cesarei, A.; Baldaro, B.
- Exploring ethical issues associated with using online surveys in educational research; 2015; Roberts, L. D.; Allen, P. J.
- An Introduction to Survey Research; 2015; Cowles, E. L.; Nelson, E.
- Ethical issues in online research; 2015; James, N.; Busher, H.
- Leading Edge Insights: Foundations of Quality 2.0; 2014; Fuguitt, G.
- Methods and systems for managing an online opinion survey service; 2014; Mcloughlin, M. H.; Seton, N.; Blesy, K.
- Recent Books and Journals in Public Opinion, Survey Methods, and Survey Statistics; 2014; Callegaro, M.
- Undisclosed Privacy: The Effect of Privacy Rights Design on Response Rates; 2014; Haer, R.; Meidert, N.
- Tailoring mode of data collection in longitudinal studies; 2013; Kaminska, O.; Lynn, P.
- How do we Know Cognitive Interviewing is Any Good?; 2013; Willis, G. B.
- Quality of Web surveys; 2013; Revilla, M.
- Experiments in Obtaining Data Linkage Consent in Web Surveys; 2013; Sakshaug, J. W.; Kreuter, F.
- Response Burden in Official Business Surveys: Measurement and Reduction Practices of National Statistical...; 2013; Giesen, D.; Bavdaz, M.; Loefgren, T.; Raymond-Blaess, V.
- Internet as a new source of information for the production of official statistics. Experiences of Statistics...; 2013; Heerschap, N.
- A standard with quality indicators for web panel surveys: a Swedish example; 2013; Nyfjaell, M.
- How Mobile Stacks Up to Traditional Online: A Comparison of Studies; 2013; Knowles, R.
- How to make your questionnaire mobile-ready; 2013; Cape, P. J.
- Phish Rising: How Internet Criminals are Undermining the Viability of Online Survey Research…and...; 2013; Kunovic, K.
- Self-Reported Participation in Research Practices Among Survey Methodology Researchers; 2013; Perez-Vergara, K.; Smith, C.; Lowenstein, C.; Ozonoff, A.; Martins, Y.
- Ethics, privacy and data security in web-based course evaluation; 2013; Salaschek, M.; Meese, C.; Thielsch, M.
- Beyond methodology - some ethical implications of "doing research online"; 2013; Heise, N.
- Code Comparison; 2012
- Evaluation procedures for Survey questions; 2012; Saris, W. E.
- Transparency, Access and the Credibility of Survey Research; 2012; Lupia, A.
- Anonymity and Confidentiality; 2012; Tourangeau, R.
- Cognitive Evaluation of Survey Instruments: State of the Science (Art?) and Future Directions; 2012; Willis, G. B.
- How to provide high data quality in online-questionnaires: Setting guidelines in design; 2012; Tries, S.; Nebel, S.; Blanke, K.
- Comparability of Survey Measurements; 2012; Oberski, D.
- Classification of Surveys; 2012; Stoop, I.; Harrison, E.
- Enhancing Web Surveys With New HTML5 Input Types; 2012; Funke, F.
- Why one should incorporate the design weights when adjusting for unit nonresponse using response homogeneity...; 2012; Kott, P. S.
- Assessing the Quality of Survey Data; 2012; Blasius, J.
- Designing and Doing Survey Research; 2012; Andres, L.
- Using break-offs in web interviews for predicting web response in mixed mode surveys; 2011; Beukenhorst, D.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2011; 2011